Moment inequalities for sums of certain independent symmetric random variables
Author
Abstract
This paper gives upper and lower bounds for moments of sums of independent random variables (Xk) which satisfy the condition P(|Xk| ≥ t) = exp(−Nk(t)), where the Nk are concave functions. As a consequence we obtain precise information about the tail probabilities of linear combinations of independent random variables for which N(t) = |t|^r for some fixed 0 < r ≤ 1. This complements work of Gluskin and Kwapień, who have done the same for convex functions N.
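As a concrete illustration (not taken from the paper), a symmetric random variable satisfying the tail condition with N(t) = |t|^r can be generated by inverse-transform sampling: if U is uniform on (0, 1], then (−log U)^(1/r) has exactly the tail exp(−t^r), and an independent random sign makes the variable symmetric. The choice r = 0.5 below is an arbitrary example of the concave case 0 < r ≤ 1.

```python
import numpy as np

def sample_symmetric_stretched_exp(r, size, rng=None):
    """Sample symmetric X with P(|X| >= t) = exp(-t**r) for t >= 0.

    Inverse transform: for U uniform on (0, 1], the variable
    (-log U)**(1/r) satisfies P((-log U)**(1/r) >= t) = P(U <= exp(-t**r))
    = exp(-t**r).  An independent random sign symmetrizes it.
    """
    rng = np.random.default_rng() if rng is None else rng
    u = 1.0 - rng.random(size=size)          # uniform on (0, 1], avoids log(0)
    magnitude = (-np.log(u)) ** (1.0 / r)
    sign = rng.choice([-1.0, 1.0], size=size)  # independent symmetric sign
    return sign * magnitude

# Empirical check of the tail for r = 0.5 (a concave N, heavier than exponential):
rng = np.random.default_rng(0)
x = sample_symmetric_stretched_exp(0.5, 200_000, rng)
t = 2.0
empirical = np.mean(np.abs(x) >= t)
theoretical = np.exp(-t ** 0.5)
```

The same inverse-transform idea works for any strictly decreasing tail function N with a computable inverse, which makes it easy to experiment numerically with the moment and tail bounds the abstract describes.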
Similar Resources
Complete Convergence and Some Maximal Inequalities for Weighted Sums of Random Variables
Let be a sequence of arbitrary random variables with and , for every and be an array of real numbers. We will obtain two maximal inequalities for partial sums and weighted sums of random variables and also, we will prove complete convergence for weighted sums , under some conditions on and sequence .
SOME PROBABILISTIC INEQUALITIES FOR FUZZY RANDOM VARIABLES
In this paper, the concepts of positive dependence and linearly positive quadrant dependence are introduced for fuzzy random variables. Also, an inequality is obtained for partial sums of linearly positive quadrant dependent fuzzy random variables. Moreover, a weak law of large numbers is established for linearly positive quadrant dependent fuzzy random variables. We extend some well known inequ...
Moment inequalities for functions of independent random variables
A general method for obtaining moment inequalities for functions of independent random variables is presented. It is a generalization of the entropy method which has been used to derive concentration inequalities for such functions [7], and is based on a generalized tensorization inequality due to Latała and Oleszkiewicz [25]. The new inequalities prove to be a versatile tool in a wide range o...
Moment Inequalities for Functions of Independent Random Variables by Stéphane Boucheron, Olivier Bousquet,
A general method for obtaining moment inequalities for functions of independent random variables is presented. It is a generalization of the entropy method which has been used to derive concentration inequalities for such functions [Boucheron, Lugosi and Massart Ann. Probab. 31 (2003) 1583–1614], and is based on a generalized tensorization inequality due to Latała and Oleszkiewicz [Lecture Note...
Lower bounds for tails of sums of independent symmetric random variables
The approach of Kleitman (1970) and Kanter (1976) to multivariate concentration function inequalities is generalized in order to obtain for deviation probabilities of sums of independent symmetric random variables a lower bound depending only on deviation probabilities of the terms of the sum. This bound is optimal up to discretization effects, improves on a result of Nagaev (2001), and complem...